Matrix Representations of Operators

In the previous sections, we introduced the state vector formalism of quantum mechanics, as well as some important results derived from it. In this section, we will explore how operators in quantum mechanics can be represented as matrices.

Operators as Matrices

Recall from the completeness relation that the identity operator can be written as a sum of projection operators:

$$I = \sum_{i} |a^{(i)}\rangle\langle a^{(i)}|$$

For an operator $X$, we can apply this relation twice to obtain:

$$X = IXI = \sum_{i} \sum_{j} |a^{(i)}\rangle \langle a^{(i)}|X|a^{(j)}\rangle \langle a^{(j)}|$$

If the ket space is $N$-dimensional, then the quantity in the middle, $\langle a^{(i)}|X|a^{(j)}\rangle$, has $N^2$ possible values. Furthermore, given all of these values, we can uniquely determine the operator $X$. As such, to a certain extent, $X$ can be represented by these values. We can put these values in a matrix, where the $i$-th row and $j$-th column contains the value $\langle a^{(i)}|X|a^{(j)}\rangle$:

$$X \doteq \begin{pmatrix} \langle a^{(1)}|X|a^{(1)}\rangle & \langle a^{(1)}|X|a^{(2)}\rangle & \cdots \\ \langle a^{(2)}|X|a^{(1)}\rangle & \langle a^{(2)}|X|a^{(2)}\rangle & \cdots \\ \vdots & \vdots & \ddots \end{pmatrix}$$

I, along with Sakurai, will use the notation $X \doteq (\cdots)$ to mean that the operator $X$ is represented by the matrix on the right-hand side.

An important property of these matrices is as follows: take $\langle a^{(i)}|X^\dagger|a^{(j)}\rangle$ and let $X^\dagger$ act to the left, noting that $\langle a^{(i)}|X^\dagger$ is the bra dual to $X|a^{(i)}\rangle$. Then, by the conjugate symmetry of the inner product, it is equal to $\langle a^{(j)}|X|a^{(i)}\rangle^*$:

$$\langle a^{(i)}|X^\dagger|a^{(j)}\rangle = \langle a^{(j)}|X|a^{(i)}\rangle^*$$

As such, the $ij$-th element of $X^\dagger$ is the complex conjugate of the $ji$-th element of $X$. In other words, the matrix of $X^\dagger$ is the conjugate transpose of the matrix of $X$.
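As a quick numerical sanity check (the operator $X$ and the standard basis of $\mathbb{C}^2$ below are arbitrary choices for illustration), we can verify with NumPy that the matrix built from the inner products $\langle a^{(i)}|X|a^{(j)}\rangle$ reproduces $X$, and that the matrix of $X^\dagger$ is its conjugate transpose:

```python
import numpy as np

# Arbitrary operator on a 2-dimensional ket space, written in the
# orthonormal basis {|a1>, |a2>} (the standard basis of C^2 here).
X = np.array([[1 + 2j, 3 - 1j],
              [0 + 1j, 2 + 0j]])
basis = [np.array([1, 0]), np.array([0, 1])]

# Matrix element <a_i|X|a_j>; np.vdot conjugates its first argument.
def elem(i, j):
    return np.vdot(basis[i], X @ basis[j])

M = np.array([[elem(i, j) for j in range(2)] for i in range(2)])
print(np.allclose(M, X))  # True: X is represented by [<a_i|X|a_j>]

# <a_i|X†|a_j> = <a_j|X|a_i>*, so the matrix of X† is the
# conjugate transpose of the matrix of X.
X_dag = np.array([[elem(j, i).conjugate() for j in range(2)]
                  for i in range(2)])
print(np.allclose(X_dag, X.conj().T))  # True
```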

Matrix Multiplication

Given two operators $X$ and $Y$, we can multiply them to get a new operator $Z = XY$. Now we shall show that the matrix representations of $X$, $Y$, and $Z$ follow the same rules as matrix multiplication.

Normally, for two matrices $A$ and $B$, the $ij$-th element of the product $AB$ is given by:

$$(AB)_{ij} = \sum_{k} A_{ik} B_{kj}$$

Thus, we would want to show the following:

$$\langle a^{(i)}|Z|a^{(j)}\rangle = \sum_{k} \langle a^{(i)}|X|a^{(k)}\rangle \langle a^{(k)}|Y|a^{(j)}\rangle$$

This is much easier to prove than it seems. Since $Z = XY$, we can write:

$$\langle a^{(i)}|Z|a^{(j)}\rangle = \langle a^{(i)}|XY|a^{(j)}\rangle$$

Now, simply use the completeness relation between the two operators to insert a sum over $k$:

$$\langle a^{(i)}|XY|a^{(j)}\rangle = \sum_{k} \langle a^{(i)}|X|a^{(k)}\rangle \langle a^{(k)}|Y|a^{(j)}\rangle$$

which is exactly what we needed to show.
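A short NumPy check of this rule (the two operators below are random matrices, purely for illustration): the $ij$-th element of the product operator equals $\sum_{k} \langle a^{(i)}|X|a^{(k)}\rangle \langle a^{(k)}|Y|a^{(j)}\rangle$, which is exactly what matrix multiplication computes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two arbitrary operators on a 3-dimensional ket space.
X = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
Y = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

# <a_i|XY|a_j> with the completeness relation inserted between X and Y.
Z_sum = np.array([[sum(X[i, k] * Y[k, j] for k in range(3))
                   for j in range(3)] for i in range(3)])

# Matrix multiplication of the two representations gives the same result.
print(np.allclose(X @ Y, Z_sum))  # True
```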

Kets as Column Vectors

Kets can also be represented as column vectors.

Recall that kets can be expanded into linear combinations of basis kets:

$$|\psi\rangle = \sum_{i} |a^{(i)}\rangle \langle a^{(i)}|\psi\rangle$$

This is similar to the Euclidean case where a vector $\vec{v}$ can be expanded as a linear combination of basis vectors:

$$\vec{v} = \sum_{i} v_i \hat{e}_i$$

In the Euclidean case, we say that $v_i$ is the $i$-th component of $\vec{v}$. Likewise, we can say that $\langle a^{(i)}|\psi\rangle$ is the $i$-th component of $|\psi\rangle$. Then, we can represent $|\psi\rangle$ as a column vector:

$$|\psi\rangle \doteq \begin{pmatrix} \langle a^{(1)}|\psi\rangle \\ \langle a^{(2)}|\psi\rangle \\ \vdots \end{pmatrix}$$

Next, suppose we apply an operator $X$ to $|\psi\rangle$ to get a new ket $|\phi\rangle = X|\psi\rangle$. By the rules of matrix multiplication, we would expect the $i$-th component of $|\phi\rangle$ to be:

$$\langle a^{(i)}|\phi\rangle = \sum_{j} \langle a^{(i)}|X|a^{(j)}\rangle \langle a^{(j)}|\psi\rangle$$

This is quite similar to the matrix multiplication rule we derived earlier. To verify it, simply write $\langle a^{(i)}|\phi\rangle = \langle a^{(i)}|X|\psi\rangle$ and apply the completeness relation again.
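To see this concretely, here is a minimal NumPy sketch (the operator and the ket components are made-up numbers): applying $X$ to $|\psi\rangle$ is just the matrix of $X$ times the column vector of $|\psi\rangle$.

```python
import numpy as np

# Matrix of an operator X and components <a_j|psi> of a ket (arbitrary).
X = np.array([[0, -1j],
              [1j, 0]])
psi = np.array([1 / np.sqrt(2), 1j / np.sqrt(2)])

# i-th component of |phi> = X|psi>, computed as sum_j <a_i|X|a_j><a_j|psi>.
phi_manual = np.array([sum(X[i, j] * psi[j] for j in range(2))
                       for i in range(2)])

# Matrix-vector multiplication gives the same column vector.
print(np.allclose(X @ psi, phi_manual))  # True
```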

Bra Vectors as Row Vectors

Bra vectors can be represented as row vectors. To find out how, suppose we apply an operator $X$ to a bra $\langle\psi|$ to get a new bra $\langle\chi| = \langle\psi|X$. The inner product of $\langle\chi|$ with any base ket $|a^{(j)}\rangle$ is, by the completeness relation:

$$\langle\chi|a^{(j)}\rangle = \langle\psi|X|a^{(j)}\rangle = \sum_{i} \langle\psi|a^{(i)}\rangle \langle a^{(i)}|X|a^{(j)}\rangle$$

The term $\langle a^{(i)}|X|a^{(j)}\rangle$ is the $ij$-th element of the matrix representation of $X$. Thus, the other part, $\langle\psi|a^{(i)}\rangle$, should be the $i$-th component of $\langle\psi|$. This suggests that $\langle\psi|$ can be represented as a row vector:

$$\langle\psi| \doteq \begin{pmatrix} \langle\psi|a^{(1)}\rangle & \langle\psi|a^{(2)}\rangle & \cdots \end{pmatrix}$$

Since the inner product is conjugate symmetric, we can equivalently write this as:

$$\langle\psi| \doteq \begin{pmatrix} \langle a^{(1)}|\psi\rangle^* & \langle a^{(2)}|\psi\rangle^* & \cdots \end{pmatrix}$$

The inner product of $\langle\psi|$ with a ket $|\phi\rangle$ is then, by the rules of matrix multiplication:

$$\langle\psi|\phi\rangle = \begin{pmatrix} \langle a^{(1)}|\psi\rangle^* & \langle a^{(2)}|\psi\rangle^* & \cdots \end{pmatrix} \begin{pmatrix} \langle a^{(1)}|\phi\rangle \\ \langle a^{(2)}|\phi\rangle \\ \vdots \end{pmatrix} = \sum_{i} \langle\psi|a^{(i)}\rangle \langle a^{(i)}|\phi\rangle$$

which aligns with simply adding a completeness relation to the original expression.
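A small NumPy illustration (with made-up components): the row vector of $\langle\psi|$ is the conjugate of the column vector of $|\psi\rangle$, and row-times-column multiplication reproduces the inner product.

```python
import numpy as np

# Components <a_i|psi> and <a_i|phi> of two kets (arbitrary numbers).
psi = np.array([1 + 1j, 2 - 1j])
phi = np.array([0 + 1j, 1 + 0j])

# Row vector of <psi|: the complex conjugates of the components of |psi>.
bra_psi = psi.conj()

# Row times column reproduces <psi|phi> = sum_i <psi|a_i><a_i|phi>;
# np.vdot conjugates its first argument, matching the bra.
print(np.isclose(bra_psi @ phi, np.vdot(psi, phi)))  # True
```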

Outer Products (and Tensor Products)

An outer product is a product of a ket and a bra in an order such that the ket is on the left. We have previously shown that outer products are not scalars (like inner products), but rather operators.

To build intuition, we borrow from Euclidean vectors. Suppose we take the product of a column vector $\begin{pmatrix} a_1 \\ a_2 \end{pmatrix}$ (a ket) and a row vector $\begin{pmatrix} b_1 & b_2 \end{pmatrix}$ (a bra). Below I use a trick to perform this multiplication outlined in the appendix.

$$\begin{pmatrix} a_1 \\ a_2 \end{pmatrix} \begin{pmatrix} b_1 & b_2 \end{pmatrix} = \begin{pmatrix} a_1 b_1 & a_1 b_2 \\ a_2 b_1 & a_2 b_2 \end{pmatrix}$$

So indeed, outer products are operators (matrices) that act on vectors.

In quantum mechanics, then, suppose we want to take the outer product $|\psi\rangle\langle\phi|$. By matrix multiplication:

$$|\psi\rangle\langle\phi| \doteq \begin{pmatrix} \langle a^{(1)}|\psi\rangle \langle\phi|a^{(1)}\rangle & \langle a^{(1)}|\psi\rangle \langle\phi|a^{(2)}\rangle & \cdots \\ \langle a^{(2)}|\psi\rangle \langle\phi|a^{(1)}\rangle & \langle a^{(2)}|\psi\rangle \langle\phi|a^{(2)}\rangle & \cdots \\ \vdots & \vdots & \ddots \end{pmatrix}$$

There is actually something deeper going on here. The outer product is a matrix that acts on vectors. This raises the question: is there a way to take the product of more complicated things to get another operator? For instance, perhaps we could take two matrices and somehow combine them to get a new operator. This operation is known as the tensor product, denoted by $\otimes$. Sometimes, the outer product is written as $|\psi\rangle \otimes \langle\phi|$.

Virtually any tensor we usually use can be created by taking tensor products of vectors and linear forms (also known as bra vectors, 1-forms, or covectors). Tensor products will definitely show up later in the course. For instance, the vector space of a system of two particles is the tensor product of the vector spaces of the individual particles.
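As a concrete sketch (made-up components again), `np.outer` builds the outer product $|\psi\rangle\langle\phi|$ from the components of the ket and the bra, and `np.kron` realizes the tensor product of two matrices as a matrix on the composite space:

```python
import numpy as np

psi = np.array([1, 2j])        # components <a_i|psi> (arbitrary)
phi = np.array([3, 1 - 1j])    # components <a_i|phi> (arbitrary)

# Outer product |psi><phi|: the (i, j) element is <a_i|psi><phi|a_j>,
# i.e. psi_i times the conjugate of phi_j.
P = np.outer(psi, phi.conj())

# Acting on a ket: (|psi><phi|)|chi> = |psi> <phi|chi> -- an operator.
chi = np.array([1, 1])
print(np.allclose(P @ chi, psi * np.vdot(phi, chi)))  # True

# Tensor product of two operators: a (2x2) ⊗ (2x2) matrix acts on the
# 4-dimensional composite space, as for a two-particle system.
A = np.eye(2)
B = np.array([[0, 1], [1, 0]])
print(np.kron(A, B).shape)  # (4, 4)
```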

Summary and Next Steps

In this rather brief section, we have shown how the various objects in quantum mechanics can be represented as matrices. This is a crucial step in understanding how quantum mechanics can be formulated in terms of linear algebra.

Here are the key points to remember:

  • Operators in quantum mechanics can be represented as matrices. Specifically, the matrix elements of an operator $X$ are given by $X_{ij} = \langle a^{(i)}|X|a^{(j)}\rangle$.
  • Kets can be represented as column vectors, and bras as row vectors.
  • Outer products are operators that act on vectors. They can be thought of as the tensor product of a ket and a bra.
  • Tensor products combine vectors and linear forms to create new operators.

In the next section, we continue our exploration of vectors and operators by looking at the change of basis.

References

  • J.J. Sakurai, "Modern Quantum Mechanics", section 1.3.

Appendix: Quick Trick for Matrix Multiplication

A quick trick to perform matrix multiplication is as follows: Given the product of two matrices $AB$, shift $B$ up and to the right, and then sum the products of the corresponding elements in the two matrices. To see what I mean, consider the following matrices:

$$A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}, \qquad B = \begin{pmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{pmatrix}$$

To multiply them, we can shift $B$ up and to the right:

$$\begin{array}{cc} & \begin{pmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{pmatrix} \\ \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} & \begin{pmatrix} \cdot & \cdot \\ \cdot & \cdot \end{pmatrix} \end{array}$$

Then, the elements in the product matrix are given by the sum of the products of the corresponding elements: each slot in the lower-right block pairs a row of $A$ with the column of $B$ directly above it. For example, the element in the first row and first column is:

$$a_{11} b_{11} + a_{12} b_{21}$$
This also easily allows one to find out whether the dimensions of the matrices are compatible for multiplication. For example, the following matrices cannot be multiplied:

$$\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \\ b_{31} & b_{32} \end{pmatrix}$$

The $b_{31}$ and $b_{32}$ would require something from a third column in the first matrix, which is not present.
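The same compatibility rule shows up numerically: a linear-algebra library rejects the product when the inner dimensions disagree. A quick NumPy demonstration (arbitrary all-ones matrices):

```python
import numpy as np

A = np.ones((2, 2))  # 2 rows, 2 columns
B = np.ones((3, 2))  # 3 rows: one more than A has columns

try:
    A @ B  # inner dimensions (2 vs. 3) do not match
except ValueError as err:
    print("incompatible shapes:", err)
```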